While we have analyzed algorithmic efficiency using time and space complexity, sorting algorithms also possess qualitative properties that matter for specialized tasks. One such property is stability, which governs how elements with equal keys are handled during sorting.
- A sorting algorithm is considered stable if it preserves the relative order of elements with equal key values: if element $x$ appears before element $y$ in the input array $A$ and their keys are equal, then $x$ must still appear before $y$ in the sorted output array $B$.
- Stability is critical for multi-key sorting (e.g., sorting a spreadsheet by Column A, then by Column B). If the sort used for Column B is stable, the ordering established by Column A is preserved among rows with equal values in Column B, as illustrated in the sketch after this list.
- Standard implementations of Merge Sort and Insertion Sort are stable, while Quick Sort and Heap Sort are typically unstable because their long-range swaps (during partitioning or heapify) can move elements with equal keys past one another (see the Merge Sort sketch after the table).
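The spreadsheet scenario can be demonstrated directly in Python, whose built-in `sorted()` is documented to be stable. This is only an illustrative sketch: the record fields `dept` and `name` are hypothetical stand-ins for Column B and Column A.

```python
# Multi-key sort using Python's sorted(), which is guaranteed stable (Timsort).
# The fields "dept" and "name" are hypothetical, chosen only to illustrate
# the Column A / Column B example from the text.

rows = [
    {"dept": "Sales", "name": "Carol"},
    {"dept": "Eng",   "name": "Alice"},
    {"dept": "Sales", "name": "Bob"},
    {"dept": "Eng",   "name": "Bob"},
]

# First sort by the secondary key (Column A: name) ...
by_name = sorted(rows, key=lambda r: r["name"])

# ... then by the primary key (Column B: dept). Because this second sort is
# stable, rows with equal dept keep the name ordering from the first pass.
by_dept_then_name = sorted(by_name, key=lambda r: r["dept"])

for r in by_dept_then_name:
    print(r["dept"], r["name"])
# Eng Alice
# Eng Bob
# Sales Bob
# Sales Carol
```

If the second sort were unstable, the rows within each `dept` group could come out in arbitrary name order, which is exactly the failure mode stability rules out.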
| Algorithm | Stability | Time Complexity (Worst) | Space (Auxiliary) |
|---|---|---|---|
| Merge Sort | Stable | $O(n \log n)$ | $O(n)$ |
| Insertion Sort | Stable | $O(n^2)$ | $O(1)$ |
| Bubble Sort | Stable | $O(n^2)$ | $O(1)$ |
| Quick Sort | Unstable | $O(n^2)$ | $O(\log n)$ |
| Selection Sort | Unstable | $O(n^2)$ | $O(1)$ |
| Heap Sort | Unstable | $O(n \log n)$ | $O(1)$ |
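To see where Merge Sort's stability comes from, here is a minimal top-down sketch (the function name `merge_sort` and the `(key, payload)` pairs are my own illustration, not a reference implementation). Stability hinges on a single choice: merging with `<=` rather than `<`, so that on equal keys the element from the left half, which appeared earlier in the input, is emitted first. The temporary lists also account for the $O(n)$ auxiliary space listed in the table.

```python
from typing import List, Tuple

def merge_sort(items: List[Tuple[int, str]]) -> List[Tuple[int, str]]:
    """Stable top-down merge sort on (key, payload) pairs, ordered by key."""
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # '<=' is what makes the sort stable: on equal keys, the element from
        # the left half (which came earlier in the input) is taken first.
        if left[i][0] <= right[j][0]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

# Two records share key 2; the stable sort keeps "first" before "second".
print(merge_sort([(2, "first"), (1, "x"), (2, "second")]))
# [(1, 'x'), (2, 'first'), (2, 'second')]
```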